
iOS Video Processing: Adding a Border


Recently a friend asked me how to process video on iOS, for example adding a title or a border. After spending a morning on it, here are the principle and the code.

Reference: https://www.raywenderlich.com/30200/avfoundation-tutorial-adding-overlays-and-animations-to-videos
GitHub: https://github.com/sunstrider12/theVideo

Initialization: to process a video we first need to get hold of an AVAsset; only then can we edit it. There is nothing fancy here, so let's go straight to the code.
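As an aside, the picker is not strictly required: if the video you want to edit already lives at a known file URL (for example a clip bundled with the app), you can create the AVAsset directly. A minimal sketch, assuming a hypothetical bundled file named sample.mov:

```objc
// Create the asset straight from a file URL instead of going through UIImagePickerController.
// "sample.mov" is a placeholder name; use whatever video file you actually have.
NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mov"];
_videoAsset = [AVAsset assetWithURL:fileURL];
```

With that said, here is the picker-based loading used in this post: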

```objc
// Load the video from the photo library.
- (void)loadVideo:(UIButton *)btn {
    UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
    imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    imagePicker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
    imagePicker.allowsEditing = YES;
    imagePicker.delegate = self;
    [self presentViewController:imagePicker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissViewControllerAnimated:YES completion:nil];
    // Use __bridge (not __bridge_retained) here; we only need a comparison, not an ownership transfer.
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        _videoAsset = [AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]];
        [self showMessage:@"load finish" details:nil view:self.view completionBlock:nil];
    }
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissViewControllerAnimated:YES completion:nil];
}
```

The following code runs when the "Save" button is tapped. It builds the AVMutableComposition and AVMutableVideoComposition, hands them to the layer-adding method, and then exports the result.

```objc
// If no video has been loaded yet, bail out.
if (!_videoAsset) {
    [self showMessage:@"First load video" details:@"" view:self.view completionBlock:nil];
    return;
}
[self showMessage:@"saving" details:@"" view:self.view completionBlock:nil];

// Create the AVMutableComposition.
AVMutableComposition *avMComposition = [[AVMutableComposition alloc] init];

// Create an AVMutableCompositionTrack and copy the source video track into it.
AVMutableCompositionTrack *avMTrack = [avMComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
[avMTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, _videoAsset.duration)
                  ofTrack:[[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                   atTime:kCMTimeZero
                    error:nil];

AVMutableVideoCompositionInstruction *avMComI = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
avMComI.timeRange = CMTimeRangeMake(kCMTimeZero, _videoAsset.duration);

// AVMutableVideoCompositionLayerInstruction controls the track's transform and opacity over time.
AVMutableVideoCompositionLayerInstruction *avMVComLayerI =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:avMTrack];
AVAssetTrack *avAssetTrack = [[_videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// Inspect preferredTransform to work out how the video was recorded,
// so portrait clips get the right render size below.
UIImageOrientation videoAssetOrientation_up = UIImageOrientationUp;
BOOL isVideoAssetPortrait = NO;
CGAffineTransform videoTransform = avAssetTrack.preferredTransform;
if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
    videoAssetOrientation_up = UIImageOrientationRight;
    isVideoAssetPortrait = YES;
}
if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
    videoAssetOrientation_up = UIImageOrientationLeft;
    isVideoAssetPortrait = YES;
}
if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
    videoAssetOrientation_up = UIImageOrientationUp;
}
if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
    videoAssetOrientation_up = UIImageOrientationDown;
}
[avMVComLayerI setTransform:avAssetTrack.preferredTransform atTime:kCMTimeZero];
[avMVComLayerI setOpacity:0.0 atTime:_videoAsset.duration];

avMComI.layerInstructions = [NSArray arrayWithObjects:avMVComLayerI, nil];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];

// For portrait videos, swap width and height so the render size matches what the user actually sees.
CGSize naturalSize;
if (isVideoAssetPortrait) {
    naturalSize = CGSizeMake(avAssetTrack.naturalSize.height, avAssetTrack.naturalSize.width);
} else {
    naturalSize = avAssetTrack.naturalSize;
}

float renderWidth = naturalSize.width;
float renderHeight = naturalSize.height;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:avMComI];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);

// All of our video modifications happen inside this method.
[self addLayerWithVideoComposition:mainCompositionInst size:naturalSize];

// After the layers have been added, export (and compress) the result to a file in Documents.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *videoPath = [documentsDirectory stringByAppendingPathComponent:
                          [NSString stringWithFormat:@"My-change-%u.mov", arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:videoPath];

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:avMComposition
                                                                  presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    [self exportDidFinish:exporter];
}];
```

When the export finishes, save the result to the photo library:

```objc
- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        ALAssetsLibrary *lib = [[ALAssetsLibrary alloc] init];
        if ([lib videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
            [lib writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        [self showMessage:@"ERROR" details:@"Save Error" view:self.view completionBlock:nil];
                    } else {
                        [self showMessage:@"OK" details:@"Save to photo" view:self.view completionBlock:nil];
                    }
                });
            }];
        }
    }
}
```
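One caveat about the save step: ALAssetsLibrary has been deprecated since iOS 9. A minimal sketch of the same step using the Photos framework instead (assuming the app has already been granted permission to add to the photo library; everything else mirrors the exportDidFinish: above):

```objc
#import <Photos/Photos.h>

- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status != AVAssetExportSessionStatusCompleted) { return; }
    NSURL *outputURL = session.outputURL;
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // Create a new video asset in the user's photo library from the exported file.
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
    } completionHandler:^(BOOL success, NSError *error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (success) {
                [self showMessage:@"OK" details:@"Save to photo" view:self.view completionBlock:nil];
            } else {
                [self showMessage:@"ERROR" details:@"Save Error" view:self.view completionBlock:nil];
            }
        });
    }];
}
```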

1. Adding a Border

First, one thing should be clear: with this approach we are not editing individual frames of the video. We simply add a CALayer on top of the video, and the played-back result looks the same. If you need to operate on the pixels of a specific frame, this technique will not do it.

The layer hierarchy looks like this: underneath parentLayer there are two sublayers, a videoLayer and an animationLayer. Exactly as the names suggest, videoLayer is responsible for rendering the video, while animationLayer is ours to customize, and we can have more than one animationLayer. Simple enough :) Now that the principle is clear, let's implement it and give the video a red border that is 2 points wide.

Step 1: Create the parentLayer

```objc
CALayer *parentLayer = [CALayer layer];
// The width and height here are the video's width and height.
parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
```

Step 2: Create the background layer

```objc
CALayer *backgroundLayer = [CALayer layer];
backgroundLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
backgroundLayer.borderColor = [UIColor redColor].CGColor;
backgroundLayer.borderWidth = 2;
// See how simple that is? It is just an ordinary CALayer, so anything you can do on a normal view works here too.
```

Step 3: Create the videoLayer

```objc
CALayer *videoLayer = [CALayer layer];
// This is the video's frame inside parentLayer, inset by the 2-point border on each side.
videoLayer.frame = CGRectMake(2, 2, naturalSize.width - (2 * 2), naturalSize.height - (2 * 2));
```

Step 4: Add both layers to parentLayer, then tell the system we are done and let it render

```objc
[parentLayer addSublayer:backgroundLayer];
[parentLayer addSublayer:videoLayer];
// This tells the system that, inside parentLayer, videoLayer is the layer responsible for playing the video.
mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
```
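Putting the four steps together, the addLayerWithVideoComposition:size: method called from the save code earlier might look like this. This is a sketch assembled from the snippets above; the method name and signature come from the call site, and only the 2-point red border from this post is included.

```objc
- (void)addLayerWithVideoComposition:(AVMutableVideoComposition *)mainCompositionInst size:(CGSize)naturalSize {
    CGFloat borderWidth = 2.0;

    // Parent layer that holds the video layer and every overlay layer.
    CALayer *parentLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);

    // Background layer that draws the red border.
    CALayer *backgroundLayer = [CALayer layer];
    backgroundLayer.frame = parentLayer.frame;
    backgroundLayer.borderColor = [UIColor redColor].CGColor;
    backgroundLayer.borderWidth = borderWidth;

    // Layer the video frames are rendered into, inset by the border width.
    CALayer *videoLayer = [CALayer layer];
    videoLayer.frame = CGRectMake(borderWidth, borderWidth,
                                  naturalSize.width - 2 * borderWidth,
                                  naturalSize.height - 2 * borderWidth);

    [parentLayer addSublayer:backgroundLayer];
    [parentLayer addSublayer:videoLayer];

    // Hook the layer tree into the video composition.
    mainCompositionInst.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
```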

That is all it takes to add a border. The next post covers adding a title, and the principle is just as simple.
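As a small preview of that post: a title is just one more sublayer added to parentLayer before the export. A minimal sketch using CATextLayer (the text, font, and frame values below are placeholders, not from the original article):

```objc
CATextLayer *titleLayer = [CATextLayer layer];
titleLayer.string = @"My Title";                              // placeholder text
titleLayer.font = (__bridge CFTypeRef)@"Helvetica-Bold";      // font specified by name
titleLayer.fontSize = naturalSize.height / 10;
titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.foregroundColor = [UIColor whiteColor].CGColor;
titleLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height / 6);
[parentLayer addSublayer:titleLayer];                         // sits alongside backgroundLayer and videoLayer
```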


